Constructing Semantic Representations From a Gradually Changing Representation of Temporal Context

Authors

  • Marc W. Howard
  • Karthik H. Shankar
  • Udaya K. K. Jagadisan
Abstract

Computational models of semantic memory exploit information about co-occurrences of words in naturally occurring text to extract information about the meaning of the words present in the language. Such models implicitly specify a representation of temporal context. Depending on the model, words are said to have occurred in the same context if they are presented within a moving window, within the same sentence, or within the same document. The temporal context model (TCM), a specific quantitative specification of temporal context, has proved useful in the study of episodic memory. The predictive temporal context model (pTCM) uses the same definition of temporal context to generate semantic memory representations. Taken together, pTCM and TCM may prove to be part of a general model of declarative memory.
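The gradually drifting context at the heart of TCM can be sketched in a few lines. The update below uses the standard TCM form, in which the context vector moves toward the current input pattern by an amount governed by a rate parameter `beta`, with `rho` chosen so the context stays unit length. The one-hot coding of words is a simplifying assumption for illustration; in pTCM the input pattern fed back into context is itself a learned semantic representation, which this sketch does not implement.

```python
import numpy as np

def update_context(t_prev, f_in, beta=0.5):
    """One TCM context step: t_new = rho * t_prev + beta * f_in.
    rho is chosen so that the updated context vector keeps unit length."""
    c = float(np.dot(t_prev, f_in))
    rho = np.sqrt(1.0 + beta**2 * (c**2 - 1.0)) - beta * c
    return rho * t_prev + beta * f_in

def contexts_for_sequence(word_ids, n_words, beta=0.5):
    """Return, for each presented word after the first, the context vector
    prevailing when it occurred. Words are coded as orthonormal one-hot
    input patterns (an illustrative assumption, not part of the model)."""
    eye = np.eye(n_words)
    t = eye[word_ids[0]].copy()        # seed context with the first word
    contexts = []
    for w in word_ids[1:]:
        contexts.append((w, t.copy())) # context in place when w is presented
        t = update_context(t, eye[w], beta)
    return contexts
```

Averaging the prevailing contexts for each word then gives a simple context-based representation of that word: words that tend to appear in similar temporal contexts end up with similar vectors, which is the intuition pTCM builds on.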

Similar Articles

Named Entity Recognition in Persian Text using Deep Learning

Named entity recognition is a fundamental task in the field of natural language processing, also known as a subset of information extraction. The process of recognizing named entities aims at finding proper nouns in text and classifying them into predetermined classes such as names of people, organizations, and places. In this paper, we propose a named entity recognizer which benefi...


A Joint Semantic Vector Representation Model for Text Clustering and Classification

Text clustering and classification are two main tasks of text mining. Feature selection plays a key role in the quality of the clustering and classification results. Although word-based features such as term frequency-inverse document frequency (TF-IDF) vectors have been widely used in different applications, their shortcoming in capturing semantic concepts of text motivated researchers to use...


Image Classification via Sparse Representation and Subspace Alignment

Image representation is a crucial problem in image processing, where there exist many low-level representations of images, e.g., SIFT, HOG, and so on. But there is a missing link across low-level and high-level semantic representations. In fact, traditional machine learning approaches, e.g., non-negative matrix factorization, sparse representation, and principal component analysis, are employed to d...


Word Type Effects on L2 Word Retrieval and Learning: Homonym versus Synonym Vocabulary Instruction

The purpose of this study was twofold: (a) to assess the retention of two word types (synonyms and homonyms) in short-term memory, and (b) to investigate the effect of these word types on word learning by asking learners to learn their Persian meanings. A total of 73 Iranian language learners studying English translation participated in the study. For the first purpose, 36 freshmen from an ...


From List Learning to Semantic Knowledge: Search and Learning of Associative Memory

Meanings—semantics—matter in the study of cognition. Recent advances in computational modeling have demonstrated how semantic information can be incorporated into episodic or experiential free recall tasks to better account for participant performance. However, one limitation of current uses of semantic representations with the SAM model is a disconnect between how the semantic representations ...



Journal:
  • Topics in Cognitive Science

Volume 3, Issue 1

Pages –

Published 2011